- relative divergence
- Mathematics: относительная расходимость (relative divergence), относительное отклонение (relative deviation)
Универсальный англо-русский словарь. Академик.ру. 2011.
Divergence de Kullback-Leibler — In probability theory and information theory, the Kullback–Leibler divergence[1][2] (also called the K–L divergence or relative entropy) is a measure of the dissimilarity between two probability distributions P and Q. It owes its… … Wikipédia en Français
Relative Strength Index — The Relative Strength Index (RSI) is a technical indicator used in the technical analysis of financial markets. It is intended to chart the current and historical strength or weakness of a stock or market based on the closing prices of a recent… … Wikipedia
Divergence problem — Twenty-year smoothed plots of averaged ring width (dashed) and tree-ring density (thick line), averaged across all sites, and shown as standardized anomalies from a common base (1881–1940), and compared with equivalent area averages of mean… … Wikipedia
Relative strength index — The Relative Strength Index (RSI) is a financial technical-analysis oscillator showing price strength by comparing upward and downward close-to-close movements. The RSI is popular because it is relatively easy to interpret. It was developed by J.… … Wikipedia
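The comparison of upward and downward close-to-close movements mentioned in the entry above can be sketched in a few lines. This is an illustrative simplification (not part of the dictionary entry): it uses plain averages over the lookback window rather than Wilder's smoothed averages, and the function name is my own.

```python
def rsi(closes, period=14):
    """Relative Strength Index over the last `period` close-to-close moves.

    RSI = 100 - 100 / (1 + RS), where RS = average gain / average loss.
    Uses simple averages for clarity; Wilder's original uses smoothing.
    """
    # Close-to-close differences, restricted to the lookback window.
    deltas = [b - a for a, b in zip(closes, closes[1:])][-period:]
    avg_gain = sum(d for d in deltas if d > 0) / period
    avg_loss = sum(-d for d in deltas if d < 0) / period
    if avg_loss == 0:
        return 100.0  # no downward moves at all
    rs = avg_gain / avg_loss
    return 100.0 - 100.0 / (1.0 + rs)
```

With steadily rising closes the RSI saturates at 100; with gains and losses of equal size and frequency it sits at 50, which is why the indicator is read as an oscillator around a neutral midpoint.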
Kullback–Leibler divergence — In probability theory and information theory, the Kullback–Leibler divergence[1][2][3] (also information divergence, information gain, relative entropy, or KLIC) is a non symmetric measure of the difference between two probability distributions P … Wikipedia
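The definition in the entry above can be made concrete with a minimal sketch (not part of the original entry; the function name is my own) for discrete distributions:

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(P || Q) for discrete distributions.

    p, q: sequences of probabilities over the same support.
    Assumes q[i] > 0 wherever p[i] > 0; terms with p[i] == 0 contribute 0.
    """
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.5]
q = [0.9, 0.1]
kl_divergence(p, q)  # ~0.51 nats
```

Note that `kl_divergence(p, q)` and `kl_divergence(q, p)` generally differ, illustrating the non-symmetry the entry mentions, and the divergence is zero exactly when the two distributions coincide.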
Quantum relative entropy — In quantum information theory, quantum relative entropy is a measure of distinguishability between two quantum states. It is the quantum-mechanical analog of relative entropy. Motivation: For simplicity, it will be assumed that all objects in the… … Wikipedia
Beam divergence — The beam divergence of an electromagnetic beam is an angular measure of the increase in beam diameter with distance from the optical aperture or antenna aperture from which the electromagnetic beam emerges. The term is relevant only in the far… … Wikipedia
Moving Average Convergence Divergence - MACD — A trend-following momentum indicator that shows the relationship between two moving averages of prices. The MACD is calculated by subtracting the 26-day exponential moving average (EMA) from the 12-day EMA. A nine-day EMA of the MACD, called the… … Investment dictionary
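The calculation stated in the entry above (12-day EMA minus 26-day EMA, with a nine-day EMA of the result as the signal line) can be sketched as follows. This is an illustrative sketch, not the dictionary's own code; the recursive EMA with smoothing factor 2/(span+1) is the conventional formulation.

```python
def ema(values, span):
    """Exponential moving average with smoothing factor 2 / (span + 1)."""
    alpha = 2.0 / (span + 1)
    out = [values[0]]  # seed with the first observation
    for v in values[1:]:
        out.append(alpha * v + (1 - alpha) * out[-1])
    return out

def macd(prices, fast=12, slow=26, signal=9):
    """Return (MACD line, signal line) for a price series.

    MACD line  = fast EMA - slow EMA
    Signal line = EMA of the MACD line over `signal` periods
    """
    fast_ema = ema(prices, fast)
    slow_ema = ema(prices, slow)
    macd_line = [f - s for f, s in zip(fast_ema, slow_ema)]
    return macd_line, ema(macd_line, signal)
```

For a flat price series both EMAs coincide, so the MACD and signal lines are identically zero; crossings of the two lines on real data are what the indicator's users watch for.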